The jump set under geometric regularisation. Part 2: Higher-order approaches
Abstract
In Part 1, we developed a new technique based on Lipschitz pushforwards for proving the jump set containment property H^{m−1}(Ju \ Jf) = 0 for solutions u of total variation denoising. We demonstrated that the technique also applies to Huber-regularised TV. In this Part 2, we extend the technique to higher-order regularisers. We are not quite able to prove the property for total generalised variation (TGV) based on the symmetrised gradient for the second-order term. We show that the property holds under three conditions: first, the solution u is locally bounded; second, the second-order variable w is of locally bounded variation, w ∈ BV_loc(Ω; R^m), instead of merely bounded deformation, w ∈ BD(Ω); third, w does not jump on Ju parallel to it. The second condition can be achieved for non-symmetric TGV. Both the second and third conditions can be achieved if we change the Radon (or L^1) norm of the symmetrised gradient Ew into an L^p norm, p > 1, in which case Korn's inequality holds. We also consider the application of the technique to infimal convolution TV, and study the limiting behaviour of the singular part of Du as the second parameter of TGV^2 goes to zero. Unsurprisingly, it vanishes; in numerical discretisations, however, the situation looks quite different. Finally, our work additionally includes a result on TGV-strict approximation in BV(Ω). Mathematics subject classification: 26B30, 49Q20, 65J20.
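For orientation, the setting can be written out explicitly. The following is the standard formulation of second-order TGV denoising with L^2-squared fidelity, in the differentiation-cascade form; the weights α, β > 0 are the usual TGV parameters, assumed here rather than fixed by the abstract itself:

    \min_{u \in \mathrm{BV}(\Omega)} \frac{1}{2}\|u - f\|_{L^2(\Omega)}^2 + \mathrm{TGV}^2_{(\beta,\alpha)}(u),
    \quad\text{where}\quad
    \mathrm{TGV}^2_{(\beta,\alpha)}(u) = \min_{w \in \mathrm{BD}(\Omega)} \alpha\,\|Du - w\|_{\mathcal{M}(\Omega;\mathbb{R}^m)} + \beta\,\|Ew\|_{\mathcal{M}(\Omega;\mathrm{Sym}^2(\mathbb{R}^m))}.

In this notation the jump set containment property asserts that, up to an \mathcal{H}^{m-1}-negligible set, the solution u jumps only where the data f does:

    \mathcal{H}^{m-1}(J_u \setminus J_f) = 0.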
Similar resources
The jump set under geometric regularisation. Part 1: Basic technique and first-order denoising
Let u ∈ BV(Ω) solve the total variation denoising problem with L^2-squared fidelity and data f. Caselles et al. [Multiscale Model. Simul. 6 (2008), 879–894] have shown the containment H^{m−1}(Ju \ Jf) = 0 of the jump set Ju of u in that of f. Their proof unfortunately depends heavily on the co-area formula, as do many results in this area, and as such is not directly extensible to higher-order, curva...
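(For reference, the co-area formula on which the level-set argument of Caselles et al. rests, and which has no analogue for the higher-order functionals treated in Part 2, is the classical identity

    \mathrm{TV}(u) = |Du|(\Omega) = \int_{-\infty}^{\infty} \mathrm{Per}(\{u > t\};\,\Omega)\,\mathrm{d}t, \qquad u \in \mathrm{BV}(\Omega),

which reduces first-order statements about u to statements about its level sets.)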
Strong polyhedral approximation of simple jump sets
We prove a strong approximation result for functions u ∈ W^{1,∞}(Ω \ J), where J is the union of finitely many Lipschitz graphs satisfying some further technical assumptions. We approximate J by a polyhedral set in such a manner that a regularisation term η(Div^i u), (i = 0, 1, 2, …), is convergent. The boundedness of this regularisation functional itself, introduced in [T. Valkonen: "Transport...
Robust Full Bayesian Methods for Neural Networks
In this paper, we propose a full Bayesian model for neural networks. This model treats the model dimension (number of neurons), model parameters, regularisation parameters and noise parameters as random variables that need to be estimated. We then propose a reversible jump Markov chain Monte Ca...
Logarithmic Opinion Pools for Conditional Random Fields
Since their recent introduction, conditional random fields (CRFs) have been successfully applied to a multitude of structured labelling tasks in many different domains. Examples include natural language processing (NLP), bioinformatics and computer vision. Within NLP itself we have seen many different application areas, like named entity recognition, shallow parsing, information extraction from...
Journal: CoRR
Volume: abs/1407.2334
Issue: –
Pages: –
Publication date: 2014